
    Improved data association and occlusion handling for vision-based people tracking by mobile robots

    This paper presents an approach for tracking multiple persons using a combination of colour and thermal vision sensors on a mobile robot. First, an adaptive colour model is incorporated into the measurement model of the tracker. Second, a new approach for detecting occlusions is introduced, using a machine learning classifier for pairwise comparison of persons (classifying which one is in front of the other). Third, explicit occlusion handling is then incorporated into the tracker.
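    The pairwise occlusion classifier mentioned above can be pictured with a minimal sketch: for every pair of overlapping person detections, a binary classifier decides which one is in front. The geometric features, the decision-tree classifier, and the synthetic training pairs below are illustrative assumptions, not the features or classifier reported in the paper.

        # Minimal sketch of pairwise occlusion reasoning: a classifier decides,
        # for two overlapping person detections A and B, whether A is in front of B.
        # Features, classifier and synthetic training data are illustrative assumptions.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        def pair_features(box_a, box_b):
            # Boxes are (x, y, w, h) in image coordinates (y grows downwards).
            xa, ya, wa, ha = box_a
            xb, yb, wb, hb = box_b
            bottom_diff = (ya + ha) - (yb + hb)   # lower bottom edge -> usually closer to the camera
            height_ratio = ha / max(hb, 1e-6)     # closer persons tend to appear taller
            x_overlap = max(0.0, min(xa + wa, xb + wb) - max(xa, xb)) / max(min(wa, wb), 1e-6)
            return [bottom_diff, height_ratio, x_overlap]

        # Tiny synthetic training set: label 1 means "A is in front of B".
        pairs = [((50, 100, 40, 120), (60, 90, 35, 90), 1),
                 ((60, 90, 35, 90), (50, 100, 40, 120), 0),
                 ((200, 150, 50, 160), (210, 120, 45, 110), 1),
                 ((210, 120, 45, 110), (200, 150, 50, 160), 0)]
        X = np.array([pair_features(a, b) for a, b, _ in pairs])
        y = np.array([label for _, _, label in pairs])
        clf = DecisionTreeClassifier(max_depth=3).fit(X, y)

        def a_in_front_of_b(box_a, box_b):
            """Pairwise decision used whenever two tracked persons overlap."""
            return bool(clf.predict([pair_features(box_a, box_b)])[0])

        print(a_in_front_of_b((100, 110, 42, 130), (105, 95, 36, 95)))  # expected: True

    One possible use of such pairwise decisions is to suspend measurement updates for the track judged to be occluded; the paper's actual occlusion handling may differ in the details.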

    Smelling Nano Aerial Vehicle for Gas Source Localization and Mapping

    This paper describes the development and validation of the currently smallest aerial platform with olfaction capabilities. The developed Smelling Nano Aerial Vehicle (SNAV) is based on a lightweight commercial nano-quadcopter (27 g) equipped with a custom gas sensing board that can host up to two in situ metal oxide semiconductor (MOX) gas sensors. Due to its small form factor, the SNAV is not a hazard for humans, enabling its use in public areas or inside buildings. It can autonomously carry out gas sensing missions in hazardous environments inaccessible to terrestrial robots and bigger drones, for example searching for victims and hazardous gas leaks inside pockets that form within the wreckage of collapsed buildings in the aftermath of an earthquake or explosion. The first contribution of this work is assessing the impact of the nano-propellers on the MOX sensor signals at different distances to a gas source. A second contribution is adapting, for real-time operation, the 'bout' detection algorithm proposed by Schmuker et al. (2016), which extracts specific features from the derivative of the MOX sensor response. The third and main contribution is the experimental validation of the SNAV for gas source localization (GSL) and mapping in a large indoor environment (160 m²) with a gas source placed in positions challenging for the drone, for example hidden in the ceiling of the room or inside a power outlet box. Two GSL strategies are compared, one based on the instantaneous gas sensor response and the other based on the bout frequency. From the measurements collected (in motion) along a predefined sweeping path, we built (in less than 3 min) a 3D map of the gas distribution and identified the most likely source location. Using the bout frequency yielded on average a higher localization accuracy than using the instantaneous gas sensor response (1.38 m versus 2.05 m error); however, accurate tuning of an additional parameter (the noise threshold) is required in the former case. The main conclusion of this paper is that a nano-drone has the potential to perform gas sensing tasks in complex environments.
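    The bout-based cue can be sketched as follows: a 'bout' is a brief, rapid rise of the MOX signal, detected as a run of samples where the smoothed derivative exceeds a noise threshold, and the bout frequency (bouts per second) is then used as the proximity cue. The smoothing window, threshold, sampling rate, and the synthetic signal below are illustrative assumptions, not the parameters or data from the paper.

        # Sketch of bout detection on a MOX gas sensor signal: count rising "bouts",
        # i.e. contiguous runs where the smoothed derivative exceeds a noise threshold.
        # Window size, threshold and sampling rate are illustrative assumptions.
        import numpy as np

        def bout_frequency(signal, fs_hz, noise_threshold, smooth_window=5):
            """Return bouts per second for one window of MOX sensor samples."""
            kernel = np.ones(smooth_window) / smooth_window
            smoothed = np.convolve(signal, kernel, mode="same")   # moving-average smoothing
            derivative = np.gradient(smoothed) * fs_hz            # sensor units per second
            above = derivative > noise_threshold
            starts = np.flatnonzero(above[1:] & ~above[:-1])      # False -> True transitions
            return len(starts) / (len(signal) / fs_hz)

        # Synthetic example: baseline noise plus three sharp concentration "hits".
        rng = np.random.default_rng(0)
        fs = 20.0                                  # assumed sampling rate [Hz]
        t = np.arange(0.0, 10.0, 1.0 / fs)
        sig = 0.02 * rng.standard_normal(t.size)
        for t0 in (2.0, 4.5, 7.0):
            sig += np.exp(-((t - t0) ** 2) / 0.01)
        print(f"bout frequency = {bout_frequency(sig, fs, noise_threshold=1.0):.2f} Hz")

    The noise threshold mentioned in the abstract corresponds to noise_threshold here, which is why its tuning matters: set too low, noise-driven fluctuations count as bouts; set too high, genuine bouts are missed.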

    Lidar-level localization with radar? The CFEAR approach to accurate, fast and robust large-scale radar odometry in diverse environments

    This paper presents an accurate, highly efficient, and learning-free method for large-scale odometry estimation using spinning radar, empirically found to generalize well across very diverse environments (outdoors, from urban to woodland, and indoors in warehouses and mines) without changing parameters. Our method integrates motion compensation within a sweep with one-to-many scan registration that minimizes distances between nearby oriented surface points and mitigates outliers with a robust loss function. Extending our previous approach CFEAR, we present an in-depth investigation on a wider range of data sets, quantifying the importance of filtering, resolution, registration cost and loss functions, keyframe history, and motion compensation. We present a new solving strategy and configuration that overcomes previous issues with sparsity and bias, and improves on our previous state of the art by 38%, thus, surprisingly, outperforming radar SLAM and approaching lidar SLAM. The most accurate configuration achieves 1.09% error at 5 Hz on the Oxford benchmark, and the fastest achieves 1.79% error at 160 Hz. Accepted for publication in Transactions on Robotics.
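    The registration idea at the core of the pipeline (minimizing distances between nearby oriented surface points under a robust loss) can be illustrated with a small 2D sketch. The point-to-plane residual, the Huber loss, the one-to-many association radius, and the synthetic corner scene below are generic assumptions for illustration, not the CFEAR implementation or its parameters.

        # Sketch of robust one-to-many point-to-plane registration in 2D:
        # minimize Huber-weighted distances from transformed source surface points
        # to the local planes (lines) defined by nearby target points and normals.
        # A generic illustration, not the CFEAR code.
        import numpy as np
        from scipy.optimize import minimize

        def huber(r, k=0.5):
            a = np.abs(r)
            return np.where(a <= k, 0.5 * r ** 2, k * (a - 0.5 * k))

        def transform(points, pose):
            """Apply a 2D rigid transform (x, y, theta) to an (N, 2) point array."""
            x, y, theta = pose
            c, s = np.cos(theta), np.sin(theta)
            R = np.array([[c, -s], [s, c]])
            return points @ R.T + np.array([x, y])

        def cost(pose, src_pts, tgt_pts, tgt_normals, radius=2.0):
            moved = transform(src_pts, pose)
            total = 0.0
            for p in moved:
                near = np.linalg.norm(tgt_pts - p, axis=1) < radius   # one-to-many association
                residuals = np.einsum("ij,ij->i", p - tgt_pts[near], tgt_normals[near])
                total += huber(residuals).sum()                        # robust point-to-plane cost
            return total

        # Synthetic corner: one wall along y = 1 and one along x = 8, with normals.
        wall1 = np.stack([np.linspace(0, 8, 30), np.full(30, 1.0)], axis=1)
        wall2 = np.stack([np.full(30, 8.0), np.linspace(1, 7, 30)], axis=1)
        tgt_pts = np.vstack([wall1, wall2])
        tgt_normals = np.vstack([np.tile([0.0, 1.0], (30, 1)), np.tile([1.0, 0.0], (30, 1))])

        # Source scan = target seen from a pose offset by (0.3, -0.2, 3 degrees).
        true_pose = np.array([0.3, -0.2, np.deg2rad(3.0)])
        c, s = np.cos(true_pose[2]), np.sin(true_pose[2])
        src_pts = (tgt_pts - true_pose[:2]) @ np.array([[c, -s], [s, c]])   # inverse transform

        res = minimize(cost, x0=np.zeros(3), args=(src_pts, tgt_pts, tgt_normals),
                       method="Nelder-Mead")
        print("recovered pose:", res.x, "true pose:", true_pose)  # roughly recovers true_pose

    The sketch only shows the shape of the per-scan cost being minimized; the paper additionally studies filtering, resolution, loss functions, keyframe history, and motion compensation.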

    Mobile Robots for Localizing Gas Emission Sources on Landfill Sites: Is Bio-Inspiration the Way to Go?

    Roboticists often take inspiration from animals for designing sensors, actuators, or algorithms that control the behavior of robots. Bio-inspiration is motivated by the uncanny ability of animals to solve complex tasks such as recognizing and manipulating objects, walking on uneven terrains, or navigating to the source of an odor plume. In particular, the task of tracking an odor plume up to its source has almost exclusively been addressed using biologically inspired algorithms, and robots have been developed, for example, to mimic the behavior of moths, dung beetles, or lobsters. In this paper we argue that biomimetic approaches to gas source localization are of limited use, primarily because animals differ fundamentally in their sensing and actuation capabilities from state-of-the-art gas-sensitive mobile robots. To support our claim, we compare the actuation and chemical sensing available to mobile robots with the corresponding capabilities of moths. We further characterize airflow and chemosensor measurements obtained with three different robot platforms (two wheeled robots and one flying micro-drone) in four prototypical environments and show that the assumption of a constant and unidirectional airflow, which is the basis of many gas source localization approaches, is usually far from valid. This analysis should help to identify how the underlying principles that govern the gas source tracking behavior of animals can be usefully “translated” into gas source localization approaches that fully take into account the capabilities of mobile robots. We also describe the requirements for a reference application, monitoring of gas emissions at landfill sites with mobile robots, and discuss an engineered gas source localization approach based on statistics as an alternative to biologically inspired algorithms.
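    One simple way to put the 'constant and unidirectional airflow' assumption to the test, in the spirit of the airflow characterization described above, is via circular statistics of the measured wind direction: a mean resultant length near 1 indicates a stable direction, while values near 0 indicate highly variable flow. The function and the synthetic samples below are a generic illustration of such a test, not the analysis performed in the paper.

        # Sketch: quantify how "unidirectional" measured airflow is, using the mean
        # resultant length of wind-direction samples (1 = perfectly stable direction,
        # 0 = directions spread uniformly). Illustrative, with assumed synthetic data.
        import numpy as np

        def mean_resultant_length(directions_rad):
            """Circular concentration of wind-direction samples given in radians."""
            return np.hypot(np.mean(np.cos(directions_rad)), np.mean(np.sin(directions_rad)))

        rng = np.random.default_rng(1)
        stable_flow = rng.normal(loc=np.pi / 4, scale=0.1, size=500)    # e.g. wind tunnel
        indoor_flow = rng.uniform(low=-np.pi, high=np.pi, size=500)     # e.g. uncontrolled room
        print(f"stable flow R = {mean_resultant_length(stable_flow):.2f}")   # close to 1
        print(f"indoor flow R = {mean_resultant_length(indoor_flow):.2f}")   # close to 0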

    Fusion of aerial images and sensor data from a ground vehicle for improved semantic mapping

    This work investigates the use of semantic information to link ground-level occupancy maps and aerial images. A ground-level semantic map, which shows open ground and indicates the probability of cells being occupied by walls of buildings, is obtained by a mobile robot equipped with an omnidirectional camera, GPS and a laser range finder. This semantic information is used for local and global segmentation of an aerial image. The result is a map in which the semantic information has been extended beyond the range of the robot sensors and which predicts where the mobile robot can find buildings and potentially driveable ground.
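    At a very high level, the fusion step can be pictured as using the ground-level semantic map as labelled seeds for classifying the aerial image: cells the robot has already labelled as building or ground are projected into aerial-image coordinates (via the GPS alignment) and used to train a pixel classifier that extends the labels beyond the robot's sensor range. The raw-RGB features, the k-NN classifier, and the array names below are assumptions for illustration, not the segmentation method used in the paper.

        # Sketch: extend ground-level semantic labels across an aerial image by
        # training a pixel classifier on cells already labelled by the robot.
        # Features (raw RGB), classifier (k-NN) and names are illustrative assumptions.
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        def extend_labels(aerial_rgb, seed_pixels, seed_labels):
            """aerial_rgb: (H, W, 3) image; seed_pixels: (N, 2) row/col indices aligned
            to the robot map via GPS; seed_labels: (N,) with 0 = ground, 1 = building."""
            h, w, _ = aerial_rgb.shape
            X_seed = aerial_rgb[seed_pixels[:, 0], seed_pixels[:, 1]].astype(float)
            clf = KNeighborsClassifier(n_neighbors=3).fit(X_seed, seed_labels)
            X_all = aerial_rgb.reshape(-1, 3).astype(float)
            return clf.predict(X_all).reshape(h, w)   # label map over the whole image

        # Toy example: a dark "building" block on a bright "ground" image, a few seeds each.
        rng = np.random.default_rng(0)
        img = np.full((60, 60, 3), 200, dtype=np.uint8)
        img[20:40, 20:40] = 60
        img = np.clip(img + rng.integers(-10, 10, img.shape), 0, 255).astype(np.uint8)
        seeds = np.array([[5, 5], [50, 10], [25, 25], [30, 35]])
        labels = np.array([0, 0, 1, 1])
        label_map = extend_labels(img, seeds, labels)
        print("building pixels predicted:", int((label_map == 1).sum()))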

    Data association and occlusion handling for vision-based people tracking by mobile robots

    This paper presents an approach for tracking multiple persons on a mobile robot with a combination of colour and thermal vision sensors, using several new techniques. First, an adaptive colour model is incorporated into the measurement model of the tracker. Second, a new approach for detecting occlusions is introduced, using a machine learning classifier for pairwise comparison of persons (classifying which one is in front of the other). Third, explicit occlusion handling is incorporated into the tracker. The paper presents a comprehensive, quantitative evaluation of the whole system and its different components using several real-world data sets.